On the Complexity of One-class SVM for Multiple Instance Learning

Authors

  • Zhen Hu
  • Zhuyin Xue
Abstract

In traditional multiple instance learning (MIL), both positive and negative bags are required to learn a prediction function. However, knowing the label of each bag, positive or negative, comes at a high human cost. Only positive bags contain what we care about (positive instances), while negative bags consist of noise or background (negative instances), so we would rather not spend much effort labeling negative bags. Contrary to this expectation, nearly all existing MIL methods require a sufficient number of negative bags in addition to positive ones. In this paper we propose an algorithm called “Positive Multiple Instance” (PMI), which learns a classifier given only a set of positive bags, making the annotation of negative bags unnecessary. PMI is built on the assumption that the unknown positive instances in the positive bags are similar to each other and form one compact cluster in feature space, while the negative instances lie outside this cluster. The experimental results demonstrate that PMI achieves performance close to, or slightly worse than, that of traditional MIL algorithms on benchmark and real data sets, while the number of training bags it requires is reduced significantly compared with traditional MIL algorithms.
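The abstract's core assumption, that instances pooled from positive bags are dominated by one compact positive cluster with negatives lying outside it, is essentially a one-class learning setup. The sketch below illustrates that idea with scikit-learn's OneClassSVM; it is a minimal illustration under assumed toy bags and parameters, not the authors' PMI formulation.

```python
# Minimal sketch of the one-class idea (illustrative; not the authors' PMI method).
# The toy bags, kernel choice, and nu value are assumptions made for this example.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)

# Toy positive bags: each mixes instances from a compact "positive" cluster
# around the origin with uniformly scattered background instances.
positive_bags = [
    np.vstack([
        rng.normal(0.0, 0.3, size=(5, 2)),    # positive-cluster instances
        rng.uniform(-4.0, 4.0, size=(3, 2)),  # background (negative) instances
    ])
    for _ in range(20)
]

# Pool every instance from the positive bags; no negative bags are labeled.
X_train = np.vstack(positive_bags)

# The one-class SVM wraps a boundary around the densest region of the pooled
# instances, which under the paper's assumption is the positive cluster.
clf = OneClassSVM(kernel="rbf", gamma="scale", nu=0.4).fit(X_train)

def predict_bag(bag):
    """Call a bag positive (+1) if at least one instance falls inside the boundary."""
    return 1 if (clf.predict(bag) == 1).any() else -1

# A bag containing cluster instances is typically flagged positive,
# while a bag of pure background is typically flagged negative.
test_pos = np.vstack([rng.normal(0.0, 0.3, size=(2, 2)),
                      rng.uniform(-4.0, 4.0, size=(4, 2))])
test_neg = rng.uniform(-4.0, 4.0, size=(6, 2))
print(predict_bag(test_pos), predict_bag(test_neg))
```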


Similar articles

CS395T Data Mining Project report One-class SVM formulations for Multiple Instance learning

Multiple Instance learning (MIL) considers a particular form of weak supervision in which the learner is given a set of positive bags and negative bags. Positive bags are sets of instances containing at least one positive example, and negative bags are sets of instances all of which are negative. A number of binary SVM-based solutions have been proposed for this problem, such as the Normalized Set Ker...


IRDDS: Instance reduction based on Distance-based decision surface

In instance-based learning, a training set is given to a classifier for classifying new instances. In practice, not all information in the training set is useful for classifiers. Therefore, it is convenient to discard irrelevant instances from the training set. This process is known as instance reduction, which is an important task for classifiers since through this process the time for classif...


Different Learning Levels in Multiple-choice and Essay Tests: Immediate and Delayed Retention

This study investigated the effects of different learning levels, including Remember an Instance (RI), Remember a Generality (RG), and Use a Generality (UG) in multiple-choice and essay tests on immediate and delayed retention. Three-hundred pre-intermediate students participated in the study. Reading passages with multiple-choice and essay questions in different levels of learning were giv...


One-Class Multiple Instance Learning and Applications to Target Tracking

Existing work in the field of Multiple Instance Learning (MIL) has only looked at the standard two-class problem, assuming both positive and negative bags are available. In this work, we propose the first analysis of the one-class version of the MIL problem, where one is only provided input data in the form of positive bags. We also propose an SVM-based formulation to solve this problem setting. To ...


Region-Based Image Clustering and Retrieval Using Multiple Instance Learning

Multiple Instance Learning (MIL) is a special kind of supervised learning problem that has been studied actively in recent years. We propose an approach based on the One-Class Support Vector Machine (SVM) to solve the MIL problem in region-based Content-Based Image Retrieval (CBIR). This is an area where a huge number of image regions are involved. For the sake of efficiency, we adopt a Genetic Alg...



Journal:
  • CoRR

Volume: abs/1603.04947  Issue: –

Pages: –

Publication date: 2016